SELF-SCALING VARIABLE METRIC (SSVM) ALGORITHMS Part I: Criteria and Sufficient Conditions for Scaling a Class of Algorithms

Authors

  • SHMUEL S. OREN
  • DAVID G. LUENBERGER
Abstract

A new criterion is introduced for comparing the convergence properties of variable metric algorithms, focusing on stepwise descent properties. This criterion is a bound on the rate of decrease in the function value at each iterative step (single-step convergence rate). Using this criterion as a basis for algorithm development leads to the introduction of variable coefficients to rescale the objective function at each iteration and, correspondingly, to a new class of variable metric algorithms. Effective scaling can be implemented by restricting the parameters in a two-parameter family of variable metric algorithms. Conditions are derived for these parameters that guarantee monotonic improvement in the single-step convergence rate. These conditions are obtained by analyzing the eigenvalue structure of the associated inverse Hessian approximations.
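As a rough illustration of the idea described above, the sketch below shows one rescaled rank-two update of the kind used in a two-parameter self-scaling variable metric family. The function name ssvm_update, the parameter theta, and the particular scaling choice gamma = s^T y / (y^T H y) are illustrative assumptions, not details taken from this abstract.

    # Illustrative sketch only: one self-scaling variable metric update of the
    # inverse Hessian approximation H, assuming a two-parameter (theta, gamma)
    # rank-two family with gamma chosen as s^T y / (y^T H y).
    import numpy as np

    def ssvm_update(H, s, y, theta=1.0):
        """Return a rescaled inverse-Hessian approximation.

        H     -- current inverse Hessian approximation (symmetric positive definite)
        s     -- step taken, x_{k+1} - x_k
        y     -- gradient change, g_{k+1} - g_k (s^T y assumed positive)
        theta -- free parameter of the rank-two update family
        """
        Hy = H @ y
        sy = float(s @ y)      # s^T y
        yHy = float(y @ Hy)    # y^T H y

        # Self-scaling coefficient applied to the "old" part of H.
        gamma = sy / yHy

        # Auxiliary vector of the two-parameter rank-two update family.
        v = np.sqrt(yHy) * (s / sy - Hy / yHy)

        return gamma * (H - np.outer(Hy, Hy) / yHy + theta * np.outer(v, v)) \
               + np.outer(s, s) / sy

With gamma fixed at 1, theta = 1 recovers the standard BFGS inverse update and theta = 0 the DFP update; the variable scaling coefficient is what the abstract refers to as rescaling the objective function at each iteration.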


Similar articles

SELF-SCALING VARIABLE METRIC (SSVM) ALGORITHMS Part II: Implementation and Experiments

This part of the paper introduces some possible implementations of Self-Scaling Variable Metric algorithms based on the theory presented in Part I. These implementations are analyzed theoretically and discussed qualitatively. A special class of SSVM algorithms is introduced, which has the additional property of being invariant under scaling of the objective function or of the variables. Exp...

Full text

Perspectives on Self-Scaling Variable Metric Algorithms

Recent attempts to assess the performance of SSVM algorithms for unconstrained minimization problems differ in their evaluations from earlier assessments. Nevertheless, the new experiments confirm earlier observations that, on certain types of problems, the SSVM algorithms are far superior to other variable metric methods. This paper presents a critical review of these recent assessments and di...

Full text

Optimal conditioning of self-scaling variable Metric algorithms

Variable Metric Methods are "Newton-Raphson-like" algorithms for unconstrained minimization in which the inverse Hessian is replaced by an approximation, inferred from previous gradients and updated at each iteration. During the past decade various approaches have been used to derive general classes of such algorithms having the common properties of being Conjugate Directions methods and having...

Full text

A Variable-Metric Method for Function Minimization Derived from Invariancy to Nonlinear Scaling

The effect of nonlinearly scaling the objective function on the variable-metric method is investigated, and Broyden's update is modified so that a property of invariancy to the scaling is satisfied. A new three-parameter class of updates is generated, and criteria for an optimal choice of the parameters are given. Numerical experiments compare the performance of a number of algorithms of the re...

Full text

Self-Scaling Variable Metric Algorithms Without Line Search for Unconstrained Minimization

This paper introduces a new class of quasi-Newton algorithms for unconstrained minimization in which no line search is necessary and the inverse Hessian approximations are positive definite. These algorithms are based on a two-parameter family of rank-two updating formulae used earlier with line search in self-scaling variable metric algorithms. It is proved that, in a quadratic case, the new ...

Full text



Publication date: 2007